
2024 iThome 鐵人賽

DAY 11
Mobile Development

手機Swift性能專家 series, part 11

ARKit Face Tracking: Basic Usage

Where we previously created a world tracking configuration, switch to ARFaceTrackingConfiguration():

    var sceneView: ARSCNView!
    sceneView = ARSCNView(frame: view.frame)
    view.addSubview(sceneView)

    // Run face tracking on the scene view's own ARSession.
    let configuration = ARFaceTrackingConfiguration()
    sceneView.session.delegate = self
    sceneView.session.run(configuration)

    func updateFaceFeatures(for faceAnchor: ARFaceAnchor) {
        // Each blend shape value is an NSNumber between 0.0 (neutral) and 1.0 (fully expressed).
        let blendShapes = faceAnchor.blendShapes

        // Detect a specific facial expression: smiling.
        if let smileLeft = blendShapes[.mouthSmileLeft]?.floatValue,
           let smileRight = blendShapes[.mouthSmileRight]?.floatValue {
            let isSmiling = (smileLeft + smileRight) / 2.0 > 0.5
            print("Smile detected: \(isSmiling)")
        }

        // Detect blinking (either eye more than half closed).
        if let eyeBlinkLeft = blendShapes[.eyeBlinkLeft]?.floatValue,
           let eyeBlinkRight = blendShapes[.eyeBlinkRight]?.floatValue {
            let isBlinking = (eyeBlinkLeft > 0.5 || eyeBlinkRight > 0.5)
            print("Blink detected: \(isBlinking)")
        }
    }
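
Face tracking requires a device with a TrueDepth camera, so it is worth checking for support before running the session. The two helpers below are a minimal sketch of my own (the names startFaceTracking and logActiveBlendShapes, and the 0.5 threshold, are assumptions, not part of the original post): the first guards on ARFaceTrackingConfiguration.isSupported, and the second prints every blend shape currently above a threshold, which is useful when tuning the cutoff values used above.

    import ARKit

    // Hypothetical helper: start face tracking only on devices that support it.
    func startFaceTracking(on sceneView: ARSCNView) {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking is not supported on this device")
            return
        }
        sceneView.session.run(ARFaceTrackingConfiguration(),
                              options: [.resetTracking, .removeExistingAnchors])
    }

    // Hypothetical helper: print every blend shape above a threshold,
    // handy for choosing per-expression cutoff values.
    func logActiveBlendShapes(of faceAnchor: ARFaceAnchor, threshold: Float = 0.5) {
        for (location, value) in faceAnchor.blendShapes where value.floatValue > threshold {
            print("\(location.rawValue): \(value.floatValue)")
        }
    }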


Next, implement the following ARSessionDelegate method:

    // ARSessionDelegate: called whenever ARKit updates its anchors.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            // Only face anchors carry blend shape data.
            guard let faceAnchor = anchor as? ARFaceAnchor else { continue }
            updateFaceFeatures(for: faceAnchor)
        }
    }
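
Putting the fragments together, here is a minimal sketch of a complete view controller. The class name FaceTrackingViewController and the viewWillAppear/viewWillDisappear lifecycle handling are my own assumptions about how the pieces fit, not code from the original post.

    import UIKit
    import ARKit

    // Hypothetical class name; one possible arrangement of the snippets above.
    class FaceTrackingViewController: UIViewController, ARSessionDelegate {
        var sceneView: ARSCNView!

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView = ARSCNView(frame: view.frame)
            view.addSubview(sceneView)
            sceneView.session.delegate = self
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            // Face tracking needs a TrueDepth camera; bail out quietly if unsupported.
            guard ARFaceTrackingConfiguration.isSupported else { return }
            sceneView.session.run(ARFaceTrackingConfiguration())
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            // Stop the camera when the view goes away.
            sceneView.session.pause()
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for anchor in anchors {
                guard let faceAnchor = anchor as? ARFaceAnchor else { continue }
                updateFaceFeatures(for: faceAnchor)
            }
        }

        func updateFaceFeatures(for faceAnchor: ARFaceAnchor) {
            let blendShapes = faceAnchor.blendShapes
            if let left = blendShapes[.mouthSmileLeft]?.floatValue,
               let right = blendShapes[.mouthSmileRight]?.floatValue {
                print("Smile detected: \((left + right) / 2.0 > 0.5)")
            }
        }
    }

Note that camera access also requires an NSCameraUsageDescription entry in the app's Info.plist.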


Previous: ARKit World Tracking: Basic Usage
Next: ARKit Skeleton Detection
Series: 手機Swift性能專家 (30 parts)